Copilot - Jailbreak Attempt Detected



Detects jailbreak attempts in Copilot interactions where a user tries to bypass Copilot guardrails and security controls. The rule identifies prompt-injection and LLM-abuse scenarios that could lead to initial access, credential access, or system impact.
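The rule's actual detection query (KQL) is not reproduced on this page; see the GitHub source. As an illustrative sketch only, a pattern-based detector over prompt text, the kind of logic such a rule typically encodes, might look like the following. The pattern list and function name here are hypothetical, not taken from the rule itself:

```python
import re

# Hypothetical phrases commonly associated with jailbreak / prompt-injection
# attempts. The real rule's pattern list lives in its KQL query on GitHub.
JAILBREAK_PATTERNS = [
    r"ignore (all )?(previous|prior) instructions",
    r"pretend (you are|to be)",
    r"without (any )?restrictions",
    r"bypass .*(guardrail|filter|safety)",
]

_COMPILED = [re.compile(p, re.IGNORECASE) for p in JAILBREAK_PATTERNS]

def is_jailbreak_attempt(prompt: str) -> bool:
    """Return True if the prompt matches any known jailbreak pattern."""
    return any(p.search(prompt) for p in _COMPILED)
```

In a production rule, matches on prompt text would typically be combined with user, session, and frequency context before raising a High-severity alert, to keep false positives manageable.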

Type: Analytic Rule
Solution: Microsoft Copilot
ID: e5f6a7b8-c9d0-41e2-f3a4-b5c6d7e8f9a0
Severity: High
Status: Available
Kind: Scheduled
Tactics: InitialAccess, CredentialAccess, Impact
Techniques: T1078, T1110, T1565
Required Connectors: MicrosoftCopilot
Source: View on GitHub

Tables Used

This content item queries data from the following tables:

Table Transformations Ingestion API Lake-Only
CopilotActivity ?
